Neural Netw ; 169: 431-441, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37931474

ABSTRACT

Multi-dimensional data are common in many applications, such as videos and multi-variate time series. While tensor decomposition (TD) provides promising tools for analyzing such data, several limitations remain. First, traditional TDs assume multi-linear structures in the latent embeddings, which greatly limits their expressive power. Second, TDs cannot be applied straightforwardly to datasets with massive numbers of samples. To address these issues, we propose a nonparametric TD with amortized inference networks. Specifically, we establish a non-linear extension of tensor ring decomposition, using neural networks, to model complex latent structures. To jointly model cross-sample correlations and physical structures, a matrix Gaussian process (GP) prior is imposed over the core tensors. For learning, we develop a VAE-like amortized inference network to infer the posterior of the core tensors corresponding to new tensor data, which enables TDs to be applied to large datasets. Our model can also be viewed as a kind of decomposed VAE, which additionally captures hidden tensor structure and enhances expressive power. Finally, we derive an evidence lower bound from which a scalable optimization algorithm is developed. The advantages of our method are evaluated extensively through data imputation on the Healing MNIST dataset and four multi-variate time series datasets.
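For context on the multi-linear baseline the paper generalizes: in classical tensor ring (TR) decomposition, each core G_k has shape (r_k, n_k, r_{k+1}) with r_{K+1} = r_1, and an entry of the full tensor is the trace of the chain of matrix slices. The sketch below is a minimal NumPy illustration of this standard TR reconstruction, not the paper's non-linear, GP-prior model; all names, shapes, and ranks are illustrative assumptions.

```python
import numpy as np

def tr_reconstruct(cores):
    """Reconstruct a full tensor from tensor-ring (TR) cores.

    Each core G_k has shape (r_k, n_k, r_{k+1}), with r_{K+1} = r_1 so the
    chain closes into a ring:
        T[i_1, ..., i_K] = trace(G_1[:, i_1, :] @ ... @ G_K[:, i_K, :])
    """
    shape = tuple(G.shape[1] for G in cores)
    T = np.empty(shape)
    for idx in np.ndindex(*shape):
        M = np.eye(cores[0].shape[0])  # start the chain with the identity
        for G, i in zip(cores, idx):
            M = M @ G[:, i, :]         # multiply in the k-th lateral slice
        T[idx] = np.trace(M)           # close the ring with a trace
    return T

# Illustrative example: random TR cores for a 3 x 4 x 5 tensor,
# TR ranks (r_1, r_2, r_3) = (2, 3, 2), with r_4 = r_1 for closure.
rng = np.random.default_rng(0)
ranks = [2, 3, 2, 2]
dims = [3, 4, 5]
cores = [rng.standard_normal((ranks[k], dims[k], ranks[k + 1]))
         for k in range(3)]
T = tr_reconstruct(cores)
print(T.shape)  # (3, 4, 5)
```

The paper's extension replaces this fixed multi-linear trace-of-products map with neural networks over the cores and places a matrix GP prior on them, so the sketch only shows the structure being generalized.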


Subject(s)
Algorithms; Learning; Neural Networks, Computer; Normal Distribution; Time Factors